Selecting Weighting Factors in Logarithmic Opinion Pools

Heskes, Tom

Neural Information Processing Systems

A simple linear averaging of the outputs of several networks, as e.g. in bagging [3], seems to follow naturally from a bias/variance decomposition of the sum-squared error. The sum-squared error of the average model is a quadratic function of the weighting factors assigned to the networks in the ensemble [7], suggesting a quadratic programming algorithm for finding the "optimal" weighting factors. If we interpret the output of a network as a probability statement, the sum-squared error corresponds to minus the log-likelihood, or the Kullback-Leibler divergence, and linear averaging of the outputs to logarithmic averaging of the probability statements: the logarithmic opinion pool. The crux of this paper is that this whole story about model averaging, bias/variance decompositions, and quadratic programming to find the optimal weighting factors is not specific to the sum-squared error, but applies to the combination of probability statements of any kind in a logarithmic opinion pool, as long as the Kullback-Leibler divergence plays the role of the error measure. As examples we treat model averaging for classification models under a cross-entropy error measure and models for estimating variances.
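
The pooling rule itself is compact: the logarithmic opinion pool is the weighted geometric mean of the individual distributions, renormalized. A minimal sketch in Python, assuming each model emits a probability vector over the same classes; names like log_opinion_pool are illustrative, and the paper's actual contribution, selecting the weights (e.g. by quadratic programming under a Kullback-Leibler error measure), is not implemented here.

```python
# Minimal sketch of a logarithmic opinion pool: the pooled distribution is
# the weighted geometric mean of the models' probability vectors, renormalized.
import numpy as np

def log_opinion_pool(probs, weights):
    """Combine model distributions via a logarithmic opinion pool.

    probs   : (n_models, n_classes) array; each row a probability vector.
    weights : (n_models,) nonnegative weights summing to one.
    """
    # Weighted sum of log-probabilities per class, i.e. the log of the
    # weighted geometric mean, computed in log space for stability.
    log_pool = weights @ np.log(probs)           # shape (n_classes,)
    pool = np.exp(log_pool - log_pool.max())     # guard against underflow
    return pool / pool.sum()                     # renormalize

# Example: three classifiers over three classes, uniform weights.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.6, 0.3, 0.1],
                  [0.5, 0.3, 0.2]])
weights = np.full(3, 1.0 / 3.0)
print(log_opinion_pool(probs, weights))
```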


Scalable Algorithms for Aggregating Disparate Forecasts of Probability

Predd, Joel B., Kulkarni, Sanjeev R., Osherson, Daniel N., Poor, H. Vincent

arXiv.org Artificial Intelligence

In this paper, computational aspects of the panel aggregation problem are addressed. Motivated primarily by applications in risk assessment, an algorithm is developed for aggregating large corpora of internally incoherent probability assessments. The algorithm is characterized by a provable performance guarantee and is demonstrated to be orders of magnitude faster than existing tools when tested on several real-world datasets. In addition, unexpected connections between research in risk assessment and wireless sensor networks are exposed, as several key ideas are shown to be useful in both fields.
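
For intuition about what "internally incoherent" means and how such an assessment can be repaired, here is a hedged sketch of the simplest case: a least-squares projection of an incoherent pair (P(A), P(not A)) onto the coherent set where the two probabilities sum to one. This illustrates the general idea only; it is not the paper's algorithm, and the function name and numbers are made up for the example.

```python
# Illustrative only: least-squares repair of an incoherent assessment over an
# event and its complement. The coherent set here is {(p, q) : p + q = 1},
# and the nearest coherent point shifts both coordinates by the same amount.

def project_complementary_pair(p_a, p_not_a):
    """Project (P(A), P(not A)) onto the line p + q = 1 in least squares."""
    shift = (1.0 - (p_a + p_not_a)) / 2.0
    # For inputs in [0, 1] the result stays in [0, 1], since
    # p_a + shift = (1 + p_a - p_not_a) / 2.
    return p_a + shift, p_not_a + shift

# An incoherent assessment: the two probabilities sum to 1.2.
print(project_complementary_pair(0.7, 0.5))  # -> (0.6, 0.4)
```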

